
    Improvements to longitudinal Clean Development Mechanism sampling designs for lighting retrofit projects

    An improved model for reducing the cost of long-term monitoring in Clean Development Mechanism (CDM) lighting retrofit projects is proposed. Cost-effective longitudinal sampling designs use the minimum number of meters required to report yearly savings at the 90% confidence and 10% relative precision level for the duration of the project (up to 10 years), as stipulated by the CDM. Improvements to the existing model include a new non-linear Compact Fluorescent Lamp population decay model based on the Polish Efficient Lighting Project, and a cumulative sampling function modified to weight samples exponentially by recency. An economic model altering the cost function to a net present value calculation is also incorporated. The search space for such sampling models is investigated and found to be discontinuous and stepped, requiring a heuristic for optimisation; in this case the Genetic Algorithm was used. Assuming an exponential smoothing rate of 0.25, an inflation rate of 6.44%, and an interest rate of 10%, results show that sampling should be more evenly distributed over the study duration than is currently considered optimal, and that the proposed improvements in model accuracy increase monitoring costs by 21.4% in present value terms.
    Centre for New Energy Systems and the National Hub for the Postgraduate Programme in Energy Efficiency and Demand Side Management at the University of Pretoria. http://www.elsevier.com/locate/apenergy
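    The recency weighting described above can be sketched in a few lines. The geometric weighting scheme below (weight (1 - smoothing)^age with the abstract's smoothing rate of 0.25) is an illustrative assumption, not the paper's exact cumulative sampling function.

```python
def recency_weighted_total(samples_per_year, smoothing=0.25):
    """Weight each year's sample count by recency: the most recent
    year gets weight 1, and older years decay geometrically."""
    total = 0.0
    for age, s in enumerate(reversed(samples_per_year)):
        total += (1.0 - smoothing) ** age * s
    return total

# an even ten-year plan vs a front-loaded one: under recency weighting,
# evenly spread samples retain far more effective size in the final year
even = [30] * 10
front = [100, 50, 30, 20, 10, 10, 10, 10, 10, 10]
```

    Under such a scheme, early over-sampling contributes little to late-year estimates, which is consistent with the finding that sampling should be spread more evenly over the study duration.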

    Low-cost energy meter calibration method for measurement and verification

    Energy meters need to be calibrated for use in Measurement and Verification (M&V) projects. However, calibration can be prohibitively expensive and affect project feasibility negatively. This study presents a novel low-cost in-situ meter data calibration technique using a relatively low accuracy commercial energy meter as a calibrator. Calibration is achieved by combining two machine learning tools: the SIMulation EXtrapolation (SIMEX) Measurement Error Model and Bayesian regression. The model is trained or calibrated on half-hourly building energy data for 24 hours. Measurements are then compared to the true values over the following months to verify the method. Results show that the hybrid method significantly improves parameter estimates and goodness of fit when compared to Ordinary Least Squares regression or standard SIMEX. This study also addresses the effect of mismeasurement in energy monitoring, and implements a powerful technique for mitigating the bias that arises because of it. Meters calibrated by the technique presented have adequate accuracy for most M&V applications, at a significantly lower cost.
    The National Research Foundation (NRF) and the National Hub for the Postgraduate Programme in Energy Efficiency and Demand Side Management. http://www.elsevier.com/locate/apenergy
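    The SIMEX idea the abstract builds on can be sketched as follows: measurement error in a predictor attenuates a regression slope, so SIMEX adds extra simulated error at increasing levels, refits at each level, and extrapolates the fitted slope back to the zero-error case. The data, error levels, and the linear extrapolation below are illustrative assumptions; the paper refines the SIMEX result with Bayesian regression rather than stopping here.

```python
import random
import statistics

def ols_slope(x, y):
    """Ordinary least squares slope of y on x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def simex_slope(x_obs, y, err_sd, lambdas=(0.5, 1.0, 1.5, 2.0),
                reps=50, seed=1):
    """SIMEX: refit with inflated simulated error, then extrapolate
    the slope back to lambda = -1 (the no-error case)."""
    rng = random.Random(seed)
    lams = [0.0]
    slopes = [ols_slope(x_obs, y)]       # naive fit, no added error
    for lam in lambdas:
        fits = []
        for _ in range(reps):
            x_extra = [a + rng.gauss(0.0, err_sd * lam ** 0.5) for a in x_obs]
            fits.append(ols_slope(x_extra, y))
        lams.append(lam)
        slopes.append(statistics.fmean(fits))
    # linear extrapolation of slope(lambda) back to lambda = -1
    b = ols_slope(lams, slopes)
    a = statistics.fmean(slopes) - b * statistics.fmean(lams)
    return a - b

# synthetic example: true slope 2, meter error sd 1.5 on the predictor
rng = random.Random(0)
x_true = [rng.uniform(0.0, 10.0) for _ in range(200)]
y = [2.0 * a + rng.gauss(0.0, 0.5) for a in x_true]
x_obs = [a + rng.gauss(0.0, 1.5) for a in x_true]   # mismeasured readings
naive = ols_slope(x_obs, y)            # attenuated towards zero
corrected = simex_slope(x_obs, y, err_sd=1.5)
```

    The corrected estimate moves back towards the true slope; a quadratic extrapolant, or the Bayesian refinement the paper proposes, would recover more of the remaining bias.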

    Efficient metering and surveying sampling designs in longitudinal measurement and verification for lighting retrofit

    Measurement and Verification (M&V) is often required for energy efficiency or demand side management projects in buildings, to demonstrate that savings were in fact achieved. For projects where sampling has to be done, sampling costs can be the most significant driver of the overall M&V project cost, especially in multi-year (longitudinal) projects. This study presents a method for calculating efficient combined metering and survey sample designs for longitudinal M&V of retrofit projects. In this paper, a building lighting retrofit case study is considered. A Dynamic Linear Model (DLM) with Bayesian forecasting is used. The Bayesian component of the model determines the sample size-weighted uncertainty bounds on multi-year metering studies, with results from previous years incorporated into the overall calculation to reduce forecast uncertainty. The DLM is compared to previous meter sampling methods, and an investigation into the robustness of efficient sampling plans is also conducted. The Mellin Transform Moment Calculation method is then used to combine the DLM with a Dynamic Generalised Linear Model describing the uncertainty in survey results for the longitudinal monitoring of lamp population decay. A genetic algorithm is employed to optimise the combined sampling design. Besides the reliable uncertainty quantification features of the method, results show a reduction in sampling costs of 40% for simple random sampling, and approximately 26.6% for stratified sampling, as compared to realistic benchmark methods.
    The National Hub for the Postgraduate Programme in Energy Efficiency and Demand Side Management. http://www.elsevier.com/locate/enbuild
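    The core DLM mechanism can be illustrated with a minimal local-level model updated by a Kalman-style Bayesian recursion, in which each year's metering result narrows the posterior (and hence forecast) variance for later years. The variances and readings below are illustrative assumptions; the paper's model and its sample-size weighting are more elaborate.

```python
def dlm_filter(observations, obs_var=4.0, state_var=0.25,
               m0=0.0, c0=100.0):
    """Forward filter for a local-level DLM (random walk plus noise);
    returns the posterior (mean, variance) after each observation."""
    m, c = m0, c0
    history = []
    for y in observations:
        r = c + state_var        # prior variance after the level evolves
        q = r + obs_var          # one-step-ahead forecast variance
        k = r / q                # adaptive gain
        m = m + k * (y - m)      # posterior mean of the level
        c = r * (1.0 - k)        # posterior variance of the level
        history.append((m, c))
    return history

yearly_readings = [10.2, 9.9, 10.5, 10.1, 10.3]   # e.g. kWh per lamp-day
posterior = dlm_filter(yearly_readings)
```

    Because the posterior variance shrinks as years accumulate, a smaller metering sample can satisfy the same precision constraint in later years, which is what the efficient sampling designs exploit.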

    Recommendations for the design of laboratory studies on non-target arthropods for risk assessment of genetically engineered plants

    This paper provides recommendations on experimental design for early-tier laboratory studies used in risk assessments to evaluate potential adverse impacts of arthropod-resistant genetically engineered (GE) plants on non-target arthropods (NTAs). While we rely heavily on the currently used proteins from Bacillus thuringiensis (Bt) in this discussion, the concepts apply to other arthropod-active proteins. A risk may exist if the newly acquired trait of the GE plant has adverse effects on NTAs when they are exposed to the arthropod-active protein. Typically, the risk assessment follows a tiered approach that starts with laboratory studies under worst-case exposure conditions; such studies have a high ability to detect adverse effects on non-target species. Clear guidance on how such data are produced in laboratory studies assists the product developers and risk assessors. The studies should be reproducible and test clearly defined risk hypotheses. These properties contribute to the robustness of, and confidence in, environmental risk assessments for GE plants. Data from NTA studies, collected during the analysis phase of an environmental risk assessment, are critical to the outcome of the assessment and ultimately the decision taken by regulatory authorities on the release of a GE plant. Confidence in the results of early-tier laboratory studies is a precondition for the acceptance of data across regulatory jurisdictions and should encourage agencies to share useful information and thus avoid redundant testing.

    Traffic and Related Self-Driven Many-Particle Systems

    Since the subject of traffic dynamics has captured the interest of physicists, many astonishing effects have been revealed and explained. Some of the questions now understood are the following: Why are vehicles sometimes stopped by so-called "phantom traffic jams", although they all like to drive fast? What are the mechanisms behind stop-and-go traffic? Why are there several different kinds of congestion, and how are they related? Why do most traffic jams occur considerably before the road capacity is reached? Can a temporary reduction of the traffic volume cause a lasting traffic jam? Under which conditions can speed limits speed up traffic? Why do pedestrians moving in opposite directions normally organize in lanes, while similar systems are "freezing by heating"? Why do self-organizing systems tend to reach an optimal state? Why do panicking pedestrians produce dangerous deadlocks? All these questions have been answered by applying and extending methods from statistical physics and non-linear dynamics to self-driven many-particle systems. This review article on traffic introduces (i) empirical data, facts, and observations, (ii) the main approaches to pedestrian, highway, and city traffic, and (iii) microscopic (particle-based), mesoscopic (gas-kinetic), and macroscopic (fluid-dynamic) models. Attention is also paid to the formulation of a micro-macro link, to aspects of universality, and to other unifying concepts like a general modelling framework for self-driven many-particle systems, including spin systems. Subjects such as the optimization of traffic flows and relations to biological or socio-economic systems such as bacterial colonies, flocks of birds, panics, and stock market dynamics are discussed as well.
    Comment: A shortened version of this article will appear in Reviews of Modern Physics, an extended one as a book. The 63 figures were omitted because of storage capacity. For related work see http://www.helbing.org
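    One concrete instance of the microscopic (particle-based) model class the review surveys is the optimal-velocity car-following model: each vehicle accelerates towards a speed that depends on its headway to the car in front. The tanh velocity function and all parameters below are standard textbook choices used here for illustration, not code from the review.

```python
import math

def step_ring(pos, vel, road_len, a=1.0, dt=0.1):
    """One Euler step of the optimal-velocity model on a circular road."""
    def v_opt(headway):
        # desired speed rises smoothly with the gap to the car ahead
        return math.tanh(headway - 2.0) + math.tanh(2.0)
    n = len(pos)
    new_vel = []
    for i in range(n):
        headway = (pos[(i + 1) % n] - pos[i]) % road_len
        new_vel.append(vel[i] + a * (v_opt(headway) - vel[i]) * dt)
    new_pos = [(p + v * dt) % road_len for p, v in zip(pos, new_vel)]
    return new_pos, new_vel

# ten cars, evenly spaced and initially at rest, relax to uniform flow
pos = [4.0 * i for i in range(10)]
vel = [0.0] * 10
for _ in range(500):
    pos, vel = step_ring(pos, vel, road_len=40.0)
```

    With a high density, a low sensitivity `a`, or a small perturbation of the spacing, the same dynamics can destabilise into stop-and-go waves, which is the phantom-traffic-jam phenomenon the review discusses.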

    A Bayesian approach to energy monitoring optimization

    This thesis develops methods for reducing energy Measurement and Verification (M&V) costs through the use of Bayesian statistics. M&V quantifies the savings of energy efficiency and demand side projects by comparing the energy use in a given period to what that use would have been, had no interventions taken place. The case of a large-scale lighting retrofit study, where incandescent lamps are replaced by Compact Fluorescent Lamps (CFLs), is considered. These projects often need to be monitored over a number of years with a predetermined level of statistical rigour, making M&V very expensive. M&V lighting retrofit projects have two interrelated uncertainty components that need to be addressed, and which form the basis of this thesis. The first is the uncertainty in the annual energy use of the average lamp, and the second the persistence of the savings over multiple years, determined by the number of lamps that are still functioning in a given year. For longitudinal projects, the results from these two aspects need to be obtained for multiple years. This thesis addresses these problems by using the Bayesian statistical paradigm. Bayesian statistics is still relatively unknown in M&V, and presents an opportunity for increasing the efficiency of statistical analyses, especially for such projects. After a thorough literature review, especially of measurement uncertainty in M&V, and an introduction to Bayesian statistics for M&V, three methods are developed. These methods address the three types of uncertainty in M&V: measurement, sampling, and modelling. The first method is a low-cost energy meter calibration technique. The second method is a Dynamic Linear Model (DLM) with Bayesian Forecasting for determining the size of the metering sample that needs to be taken in a given year. The third method is a Dynamic Generalised Linear Model (DGLM) for determining the size of the population survival survey sample. 
It is often required by law that M&V energy meters be calibrated periodically by accredited laboratories. This can be expensive and inconvenient, especially if the facility needs to be shut down for meter installation or removal. Some jurisdictions also require meters to be calibrated in situ, i.e. in their operating environments. However, it is shown that metering uncertainty has a relatively small impact on overall M&V uncertainty in the presence of sampling, and therefore the costs of such laboratory calibration may outweigh the benefits. The proposed technique uses another commercial-grade meter (which also measures with error) to achieve this calibration in situ. This is done by accounting for the mismeasurement effect through a mathematical technique called Simulation Extrapolation (SIMEX). The SIMEX result is refined using Bayesian statistics, and achieves acceptably low error rates and accurate parameter estimates. The second technique uses a DLM with Bayesian forecasting to quantify the uncertainty in metering only a sample of the total population of lighting circuits. A Genetic Algorithm (GA) is then applied to determine an efficient sampling plan. Bayesian statistics is especially useful in this case because it allows the results from previous years to inform the planning of future samples. It also allows for exact uncertainty quantification, which current confidence-interval techniques do not always provide. Results show a cost reduction of up to 66%, but this depends on the costing scheme used. The study then explores the robustness of the efficient sampling plans to forecast error, and finds a 50% chance of undersampling for such plans, because the standard M&V sampling formula lacks statistical power. The third technique uses a DGLM in the same way as the DLM, except for population survival survey samples and persistence studies, not metering samples.
Convolving the binomial survey result distributions inside a GA is problematic, so instead of Monte Carlo simulation, a relatively new technique called Mellin Transform Moment Calculation is applied to the problem. The technique is then expanded to model stratified sampling designs for heterogeneous populations. Results show a cost reduction of 17-40%, although this depends on the costing scheme used. Finally, the DLM and DGLM are combined into an efficient overall M&V plan in which metering and survey costs are traded off over multiple years, while still adhering to statistical precision constraints. This is done for simple random sampling and stratified designs. Monitoring costs are reduced by 26-40% for the costing scheme assumed. The results demonstrate the power and flexibility of Bayesian statistics for M&V applications, both in terms of exact uncertainty quantification, and by increasing the efficiency of the study and reducing monitoring costs.
Thesis (PhD), University of Pretoria, 2017. National Research Foundation; Department of Science and Technology; National Hub for the Postgraduate Programme in Energy Efficiency and Demand Side Management. Electrical, Electronic and Computer Engineering.
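The moment-combination idea behind Mellin Transform Moment Calculation can be sketched directly: for independent random variables, E[(XY)^k] = E[X^k] E[Y^k], because the Mellin transform of a product of independent variables is the product of their transforms. So the mean and variance of a product like (energy per lamp) x (surviving lamps) follow from each factor's first two moments, with no Monte Carlo convolution. The numbers below are illustrative assumptions, not values from the thesis.

```python
def product_moments(mean_x, var_x, mean_y, var_y):
    """Mean and variance of X*Y for independent X and Y, using
    E[(XY)^k] = E[X^k] * E[Y^k] for k = 1, 2."""
    ex2 = var_x + mean_x ** 2        # E[X^2]
    ey2 = var_y + mean_y ** 2        # E[Y^2]
    mean_xy = mean_x * mean_y
    var_xy = ex2 * ey2 - mean_xy ** 2
    return mean_xy, var_xy

# e.g. mean energy saved per lamp (X) times surviving lamp count (Y)
saving_mean, saving_var = product_moments(1.2, 0.04, 900.0, 400.0)
```

Inside a GA fitness evaluation, this closed form is evaluated thousands of times per generation, which is why avoiding simulation matters for the optimisation.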

    Improvements to longitudinal Clean Development Mechanism sampling designs for lighting retrofit projects

    An improved model for reducing the cost of long-term monitoring in Clean Development Mechanism (CDM) lighting retrofit projects is proposed. Cost-effective longitudinal sampling designs use the minimum number of meters required to report yearly savings at the 90% confidence and 10% relative precision level for the duration of the project (up to 10 years), as stipulated by the CDM. Improvements to the existing model include a new non-linear Compact Fluorescent Lamp population decay model based on the results of the Polish Efficient Lighting Project, and a cumulative sampling function modified to weight samples exponentially by recency. An economic model altering the cost function to a net present value calculation is also incorporated. The search space for such sampling models is investigated and found to be discontinuous and stepped, requiring a heuristic for optimisation; in this case the Genetic Algorithm was used. Assuming an exponential smoothing rate of 0.25, an inflation rate of 6.44%, and an interest rate of 10%, results show that sampling should be more evenly distributed over the study duration than is currently considered optimal, and that the proposed improvements in model accuracy increase expected project costs in net present value terms by approximately 20%. A sensitivity analysis reveals that the expected project cost is most sensitive to the reporting precision level, coefficient of variation, and reporting
    Dissertation (MEng), University of Pretoria, 2014. Electrical, Electronic and Computer Engineering.
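    The net present value cost function described above can be sketched as follows: each year's metering spend is inflated to its payment year and then discounted back at the interest rate, using the abstract's rates (6.44% inflation, 10% interest). The per-meter cost and the sample plans are illustrative assumptions.

```python
def npv_cost(samples_per_year, cost_per_meter=100.0,
             inflation=0.0644, interest=0.10):
    """Net present value of a multi-year metering plan: each year's
    nominal spend is inflated forward, then discounted back."""
    total = 0.0
    for year, n in enumerate(samples_per_year):
        nominal = n * cost_per_meter * (1.0 + inflation) ** year
        total += nominal / (1.0 + interest) ** year
    return total

# because interest exceeds inflation, deferred spending is cheaper
# in present value terms, which favours more evenly spread sampling
even = npv_cost([20] * 5)
front = npv_cost([60, 20, 10, 5, 5])
```

    Since the discount rate exceeds the inflation rate, each deferred meter costs slightly less in present value terms, one reason the optimised plans spread sampling across the study duration.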

    Bayesian Energy Measurement and Verification Analysis

    Energy Measurement and Verification (M&V) aims to make inferences about the savings achieved in energy projects, given the data and other information at hand. Traditionally, a frequentist approach has been used to quantify these savings and their associated uncertainties. We demonstrate that the Bayesian paradigm is an intuitive, coherent, and powerful alternative framework within which M&V can be done. Its advantages and limitations are discussed, and two examples from the industry-standard International Performance Measurement and Verification Protocol (IPMVP) are solved using the framework. Bayesian analysis is shown to describe the problem more thoroughly and yield richer information and uncertainty quantification results than the standard methods, while not sacrificing model simplicity. We also show that Bayesian methods can be more robust to outliers. Bayesian alternatives to standard M&V methods are listed, and examples from the literature are cited.
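    A minimal sketch of the Bayesian flavour described here: a conjugate normal-normal update gives a full posterior distribution for the mean monthly saving, from which a credible interval follows directly rather than via a separate frequentist interval formula. The prior, observation variance, and data below are illustrative assumptions, not the IPMVP examples.

```python
import math

def posterior_mean_savings(savings, prior_mean=0.0, prior_var=1e6,
                           obs_var=25.0):
    """Conjugate normal-normal update for the mean saving, assuming a
    known observation variance; returns posterior mean and variance."""
    n = len(savings)
    xbar = sum(savings) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + n * xbar / obs_var)
    return post_mean, post_var

monthly_savings = [12.1, 9.8, 11.4, 10.6, 12.9, 10.2]   # e.g. MWh saved
m, v = posterior_mean_savings(monthly_savings)
interval = (m - 1.96 * math.sqrt(v), m + 1.96 * math.sqrt(v))
```

    With a vague prior the posterior mean is essentially the sample mean, but the same machinery accepts an informative prior from engineering estimates or earlier reporting periods, which is where the Bayesian approach yields the richer information the abstract mentions.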